Simple Reparameterization to Improve Convergence in Linear Mixed Models
Authors
Abstract
1 Univ. of Ljubljana, Biotechnical Fac., Dept. of Animal Science, Groblje 3, SI-1230 Domžale, Slovenia, Ph.D., e-mail: [email protected]
2 Same address as 1
3 Departamento de Mejora Genética, Instituto Nacional de Investigación Agraria, Carretera de La Coruña, km 7, 28040 Madrid, Spain, Ph.D.

Slow convergence and poor mixing are among the main problems of Markov chain Monte Carlo (McMC) algorithms applied to mixed models in animal breeding. Poor convergence is to a large extent caused by high posterior correlation between variance components and the solutions for the levels of the associated effects. A simple reparameterization of the conventional model for variance component estimation is presented that improves McMC sampling while yielding the same posterior distributions as the conventional model. The reparameterization is based on rescaling the hierarchical (random) effects in the model, which alleviates this posterior correlation. The reparameterized model is compared against the conventional model using several simulated data sets. The results show that the presented reparameterization gives better behaviour of the associated sampling methods and is several times more efficient for low values of heritability.
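The rescaling idea can be sketched in a few lines. The following is a minimal illustration (all sizes and values are hypothetical, not taken from the paper's simulations): the conventional model samples the random effects u directly, while the rescaled model samples u_tilde = u / sigma_u, which is a priori independent of the variance component. Both parameterizations assign the same likelihood to matching states, so the posterior is unchanged.

```python
import numpy as np

# Conventional (centered) model:  y = mu + Z u + e,  u ~ N(0, sigma_u^2 I).
# Rescaled model:                 u = sigma_u * u_tilde,  u_tilde ~ N(0, I),
# so the sampler moves in u_tilde, a priori independent of sigma_u.
rng = np.random.default_rng(0)
n_groups, n_per = 10, 5                       # hypothetical sizes
Z = np.kron(np.eye(n_groups), np.ones((n_per, 1)))
mu, sigma_u, sigma_e = 2.0, 0.5, 1.0          # hypothetical true values
u = sigma_u * rng.standard_normal(n_groups)
y = mu + Z @ u + sigma_e * rng.standard_normal(n_groups * n_per)

def loglik_centered(u):
    r = y - mu - Z @ u
    return -0.5 * r @ r / sigma_e**2

def loglik_rescaled(u_tilde):
    # identical data model, expressed through the rescaled effects
    return loglik_centered(sigma_u * u_tilde)

# Same likelihood for matching states, hence the same posterior
same = np.isclose(loglik_centered(u), loglik_rescaled(u / sigma_u))
```

Because u_tilde and sigma_u are a priori independent, a Gibbs or Metropolis sampler in the rescaled coordinates does not have to traverse the strong ridge between the variance component and the effect solutions, which is what improves mixing at low heritability.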
Similar Resources
Comparison results on the preconditioned mixed-type splitting iterative method for M-matrix linear systems
Consider the linear system Ax = b, where the coefficient matrix A is an M-matrix. In the present work it is proved that the Gauss-Seidel method converges faster than the mixed-type splitting and AOR (SOR) iterative methods for solving M-matrix linear systems. Furthermore, we improve the rate of convergence of the mixed-type splitting iterative method by applying a preconditio...
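For reference, the baseline Gauss-Seidel iteration that the abstract compares against can be sketched as follows (a minimal implementation with a hypothetical M-matrix; the preconditioned mixed-type splitting itself is not reproduced here):

```python
import numpy as np

def gauss_seidel(A, b, tol=1e-10, max_iter=500):
    """Gauss-Seidel sweep: x[i] is updated in place using the newest values."""
    n = len(b)
    x = np.zeros(n)
    for k in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x - x_old, np.inf) < tol:
            return x, k + 1
    return x, max_iter

# Hypothetical M-matrix (positive diagonal, nonpositive off-diagonal)
A = np.array([[4., -1., -1.],
              [-1., 4., -1.],
              [-1., -1., 4.]])
b = np.array([2., 2., 2.])
x, iters = gauss_seidel(A, b)      # exact solution is x = (1, 1, 1)
```

For an M-matrix the diagonal dominates, so the sweep converges; the abstract's point is that suitable preconditioning can accelerate the mixed-type splitting beyond this baseline.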
Location Reparameterization and Default Priors for Statistical Analysis
This paper develops default priors for Bayesian analysis that reproduce familiar frequentist and Bayesian analyses for models that are exponential or location. For the vector-parameter case there is an information adjustment that avoids the Bayesian marginalization paradoxes and properly targets the prior on the parameter of interest, thus adjusting for any complicating nonlinearity; the details ...
Weight Normalization: A Simple Reparameterization to Accelerate Training of Deep Neural Networks
We present weight normalization: a reparameterization of the weight vectors in a neural network that decouples the length of those weight vectors from their direction. By reparameterizing the weights in this way, we improve the conditioning of the optimization problem and speed up convergence of stochastic gradient descent. Our reparameterization is inspired by batch normalization but does no...
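The decoupling in that abstract is the identity w = g · v / ‖v‖: the scalar g carries the length and v / ‖v‖ carries the direction. A minimal numeric sketch (variable names are illustrative):

```python
import numpy as np

def weight_norm(v, g):
    # w = g * v / ||v||: the scale g is decoupled from the direction v / ||v||
    return g * v / np.linalg.norm(v)

v = np.array([3.0, 4.0])        # hypothetical weight vector, ||v|| = 5
w = weight_norm(v, g=2.0)
norm_w = np.linalg.norm(w)      # equals g regardless of how long v is
```

During training, gradients are then taken with respect to g and v separately, which is what improves the conditioning of the optimization.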
Using Redundant Parameterizations to Fit Hierarchical Models
Hierarchical linear and generalized linear models can be fit using Gibbs samplers and Metropolis algorithms; these models, however, often have many parameters, and convergence of the seemingly most natural Gibbs and Metropolis algorithms can sometimes be slow. We examine solutions that involve reparameterization and overparameterization. We begin with parameter expansion using working parameter...
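The working-parameter idea can be illustrated with a multiplicative expansion (a toy sketch with hypothetical names, not the paper's algorithm): the group effects are written redundantly as eta_j = alpha · xi_j, so only the product is identified, and a sampler that moves jointly in (alpha, xi) can escape the slow directions of the original parameterization.

```python
import numpy as np

# Toy one-way model y_ij = eta_{group(i)} + e_ij with redundant expansion
# eta_j = alpha * xi_j; the likelihood depends only on the product.
rng = np.random.default_rng(1)
y = rng.standard_normal(20)
group = np.repeat(np.arange(4), 5)
eta = rng.standard_normal(4)

def loglik(eta):
    r = y - eta[group]
    return -0.5 * r @ r

# Any value of the working parameter alpha leaves the likelihood unchanged,
# which is the redundancy the sampler exploits for faster mixing.
ok = all(np.isclose(loglik(eta), loglik(alpha * (eta / alpha)))
         for alpha in (0.5, 2.0, -3.0))
```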
The Reparameterization of Linear Models Subject to Exact Linear Restrictions
The estimation of regression models subject to exact linear restrictions is a widely applied technique; however, aside from simple examples, the reparameterization method is rarely employed except in the case of polynomial lags. We believe this is due to the lack of a general transformation method for changing from the definition of restrictions in terms of the unrestricted parameters to the e...
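A general reparameterization of the kind that abstract refers to can be sketched as follows (a minimal sketch under an assumed restriction b0 + b1 = 1, with hypothetical data): write the restriction as Rβ = r, take a particular solution plus a null-space basis of R, and fit an unrestricted regression in the reduced parameter.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical design: 4 coefficients with exact restriction b0 + b1 = 1
X = rng.standard_normal((50, 4))
beta_true = np.array([0.3, 0.7, -1.0, 2.0])   # satisfies the restriction
y = X @ beta_true + 0.1 * rng.standard_normal(50)

R = np.array([[1.0, 1.0, 0.0, 0.0]])
r = np.array([1.0])

# beta = beta_p + N @ gamma, where R @ beta_p = r and the columns of N
# span the null space of R, so R @ beta = r holds for every gamma.
beta_p = np.linalg.pinv(R) @ r
_, _, Vt = np.linalg.svd(R)
N = Vt[R.shape[0]:].T                          # null-space basis of R

# Ordinary (unrestricted) least squares in the reduced parameter gamma
gamma, *_ = np.linalg.lstsq(X @ N, y - X @ beta_p, rcond=None)
beta_hat = beta_p + N @ gamma                  # satisfies R @ beta_hat = r
```

The restriction is then enforced exactly by construction rather than approximately through penalties, which is the appeal of the reparameterization approach.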